A Neural Architecture Search Method using Auxiliary Evaluation Metric based on ResNet Architecture
Shang Wang, Huanrong Tang, Jianquan Ouyang
This paper proposes a neural architecture search space built on the ResNet framework, with search objectives that include the parameters of the convolution, pooling, and fully connected layers, as well as the connectivity of the residual network. In addition to recognition accuracy, the loss value on the validation set is used as a secondary optimization objective. Experimental results demonstrate that the proposed search space, together with the optimization approach, finds competitive network architectures on the MNIST, Fashion-MNIST, and CIFAR-100 datasets.
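The abstract names the ingredients (a ResNet-style search space and two objectives: accuracy plus validation loss) but not the exact encoding or optimizer. A minimal illustrative sketch, with a hypothetical search-space encoding and a faked evaluator standing in for actual training, might rank candidates by the two objectives like this:

```python
import random

# Hypothetical ResNet-style search space (illustrative only; the paper's
# exact parameterization is not reproduced in the abstract).
SEARCH_SPACE = {
    "conv_filters": [16, 32, 64],
    "kernel_size": [3, 5],
    "pool_type": ["max", "avg"],
    "fc_units": [128, 256],
    "skip_connection": [True, False],  # residual connectivity on/off
}

def sample_architecture(rng):
    """Draw one candidate architecture from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch, rng):
    """Stand-in for training plus validation; returns (accuracy, val_loss).

    A real run would train the candidate network on the dataset; here the
    scores are random placeholders so the sketch stays self-contained."""
    acc = rng.uniform(0.5, 1.0)
    loss = rng.uniform(0.1, 2.0)
    return acc, loss

def search(n_candidates=20, seed=0):
    """Score candidates, then rank by accuracy first, validation loss second."""
    rng = random.Random(seed)
    scored = []
    for _ in range(n_candidates):
        arch = sample_architecture(rng)
        acc, loss = evaluate(arch, rng)
        scored.append((acc, loss, arch))
    # Primary objective: maximize accuracy; auxiliary: minimize val loss.
    scored.sort(key=lambda t: (-t[0], t[1]))
    return scored

best_acc, best_loss, best_arch = search()[0]
```

The point of the auxiliary objective is the tie-breaking behavior: among candidates with similar accuracy, the one with lower validation loss ranks higher, which the sort key above captures.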
What is neural architecture search? AutoML for deep learning
Neural architecture search is the task of automatically finding one or more architectures for a neural network that will yield models with good results (low losses), relatively quickly, for a given dataset. It is currently an emerging area: there is a lot of active research, there are many different approaches to the task, and there is no single best method in general -- nor even a single best method for a specialized problem such as object identification in images. Neural architecture search is one aspect of AutoML, along with feature engineering, transfer learning, and hyperparameter optimization. It is probably the hardest machine learning problem currently under active research; even the evaluation of neural architecture search methods is hard.
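The definition above reduces to an outer loop around model training: sample an architecture, evaluate it, keep the best. A minimal sketch of the simplest such strategy, random search, with a toy architecture description and a placeholder in place of the expensive training step (all names here are illustrative):

```python
import random

def sample_architecture(rng, depth_choices=(2, 4, 6), width_choices=(16, 32, 64)):
    """Sample a toy architecture description (fields are illustrative)."""
    return {"depth": rng.choice(depth_choices), "width": rng.choice(width_choices)}

def train_and_score(arch, rng):
    """Placeholder for the expensive step: train the model, return validation loss.

    Here we pretend deeper and wider networks do a little better, plus noise."""
    return 1.0 / (arch["depth"] * arch["width"]) + rng.uniform(0.0, 0.01)

def random_search(budget=30, seed=0):
    """The simplest NAS strategy: sample, evaluate, keep the lowest-loss candidate."""
    rng = random.Random(seed)
    best_arch, best_loss = None, float("inf")
    for _ in range(budget):
        arch = sample_architecture(rng)
        loss = train_and_score(arch, rng)
        if loss < best_loss:
            best_arch, best_loss = arch, loss
    return best_arch, best_loss
```

Every real NAS method replaces one of these pieces: the sampler (evolutionary or reinforcement-learning controllers), the evaluator (weight sharing, proxy tasks), or both -- which is part of why comparing methods fairly is hard.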